    HyperS tableaux - heuristic hyper tableaux

    Several syntactic methods have been developed to automate theorem proving in first-order logic. Positive (negative) hyper-resolution and clause tableaux were combined into a single calculus called hyper tableaux in [1]. In this paper we propose a new calculus, called hyperS tableaux, which overcomes substantial drawbacks of hyper tableaux. In contrast to hyper tableaux, hyperS tableaux are entirely automated and heuristic. We prove the soundness and completeness of hyperS tableaux. HyperS tableaux are applied in the theorem prover Sofia, which additionally provides useful tools for clause set generation (based on justificational tableaux) and for tableau simplification (based on redundancy), as well as advantageous heuristics. A further feature is support for so-called parametrized theorems, which enables the prover to give compound answers.

    Introducing general redundancy criteria for clausal tableaux, and proposing resolution tableaux

    Hyper tableau calculi are well known as attempts to combine hyper-resolution and tableaux. Besides their soundness and completeness, it is also important to give an appropriate redundancy criterion. The task of such a criterion is to filter out “unnecessary” clauses attached to a given tableau. This is why we investigate what redundancy criteria can be defined for clausal tableaux in general. This investigation led us to a general idea for combining resolution calculi and tableaux. The goal is the same as in the case of hyper-tableau calculi: to split (hyper-)resolution derivations into branches. We propose a novel method called resolution tableaux. Resolution tableaux are more general than hyper tableaux, since any resolution calculus (not only hyper-resolution) can be applied, e.g., binary resolution, input resolution, or lock resolution. We prove that any resolution tableau calculus inherits the soundness and completeness of the resolution calculus being applied. Hence, resolution tableaux can be regarded as a kind of parallelization of resolution.
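
    The calculus itself is beyond a short snippet, but the single inference it generalizes, a binary resolution step on clauses, is easy to illustrate. The following Python sketch is only that illustration (clauses as sets of signed integers); it is not the resolution tableau calculus proposed in the paper.

```python
# Minimal sketch (not the paper's calculus): a single binary resolution
# step on propositional clauses, represented as frozensets of literals.
# Positive integers are positive literals, negative integers their complements.

def resolve(clause_a, clause_b):
    """Return all non-tautological resolvents of two clauses."""
    resolvents = set()
    for lit in clause_a:
        if -lit in clause_b:
            resolvent = (clause_a - {lit}) | (clause_b - {-lit})
            # Skip tautological resolvents (those containing both p and -p).
            if not any(-l in resolvent for l in resolvent):
                resolvents.add(frozenset(resolvent))
    return resolvents

# Example: {p, q} and {-p, r} resolve on p to {q, r}.
print(resolve(frozenset({1, 2}), frozenset({-1, 3})))  # {frozenset({2, 3})}
```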

    Portfolio solver for verifying Binarized Neural Networks

    Although deep learning is a very successful AI technology, many concerns have been raised about the extent to which the decision-making process of deep neural networks can be trusted. Verifying properties of neural networks, such as adversarial robustness and network equivalence, sheds light on the trustworthiness of such systems. We focus on an important family of deep neural networks, the Binarized Neural Networks (BNNs), which are useful in resource-constrained environments such as embedded devices. We introduce our portfolio solver, which is able to encode BNN properties for SAT, SMT, and MIP solvers and run them in parallel, in a portfolio setting. In the paper we propose the corresponding encodings of the different types of BNN layers, as well as of BNN properties, into SAT, SMT, cardinality constraints, and pseudo-Boolean constraints. Our experimental results demonstrate that our solver is capable of verifying the adversarial robustness of medium-sized BNNs in reasonable time and seems to scale to larger BNNs. We also report on experiments on network equivalence with promising results.
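
    To make the cardinality-constraint encoding concrete, here is a hedged Python sketch that turns one binarized neuron into an "at least k of these literals" constraint, the kind of constraint handed to the SAT/SMT/MIP back ends. The function name, the {-1, +1} conventions, and sign(0) = +1 are assumptions made for illustration; the paper's exact encodings may differ.

```python
import math

# Hedged sketch: encode one binarized neuron (weights and inputs in {-1,+1},
# real-valued bias b, output sign(w.x + b)) as a cardinality constraint.
# Boolean variable i is True iff input x_i = +1; a returned literal is +var
# for x_i and -var for its negation.

def binarized_neuron_to_card(weights, bias, first_var=1):
    n = len(weights)
    # w.x = 2*agreements - n, so sign(w.x + b) >= 0  iff
    # agreements >= ceil((n - b) / 2).
    threshold = math.ceil((n - bias) / 2)
    literals = []
    for i, w in enumerate(weights):
        var = first_var + i
        literals.append(var if w > 0 else -var)  # literal that marks agreement
    return literals, threshold

# Example: 3 inputs, weights (+1, -1, +1), bias 0 -> at least 2 of
# [x1, -x2, x3] must hold for the neuron to output +1.
print(binarized_neuron_to_card([1, -1, 1], 0))  # ([1, -2, 3], 2)
```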

    Simplifying the propositional satisfiability problem by sub-model propagation

    We describe cases in which a general SAT problem instance can be simplified by sub-model propagation. Assume that we test whether our input clause set is blocked, because we know that a blocked clause set can be solved in polynomial time. If the input clause set is not blocked, but some of its clauses are blocked, what can we do? Can we use the blocked clauses to simplify the clause set? The Blocked Clear Clause Rule and the Independent Blocked Clause Rule describe cases in which the answer is yes. The other two independent clause rules, the Independent Nondecisive and the Independent Strongly Nondecisive Clause Rules, describe cases in which nondecisive and strongly nondecisive clauses can be used to simplify a general SAT problem instance.
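
    The rules above build on the standard notion of a blocked clause. As a hedged illustration of that underlying test (not of the paper's simplification rules themselves), the following Python sketch checks whether a clause is blocked in a CNF formula, with clauses given as frozensets of signed integer literals.

```python
# A literal l of clause C blocks C in CNF formula F if every resolvent of C
# with a clause containing -l (resolved on l) is a tautology. This is the
# textbook definition; the paper's rules describe when such clauses help.

def is_tautology(clause):
    return any(-lit in clause for lit in clause)

def blocks(lit, clause, formula):
    """Does `lit` block `clause` with respect to `formula`?"""
    for other in formula:
        if other is clause or -lit not in other:
            continue
        resolvent = (clause - {lit}) | (other - {-lit})
        if not is_tautology(resolvent):
            return False
    return True

def is_blocked(clause, formula):
    return any(blocks(lit, clause, formula) for lit in clause)

F = [frozenset({1, 2}), frozenset({-1, -2}), frozenset({-1, 3})]
print(is_blocked(F[0], F))  # True: literal 2 blocks {1, 2}, its only resolvent on 2 is a tautology
```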

    Formal verification for quantized neural networks

    Although deep neural networks are being successfully used in many fields of computing, it is still challenging to verify their trustworthiness. It has previously been shown that binarized neural networks can be verified by encoding them into Boolean constraints. In this paper, we generalize this encoding to quantized neural networks (QNNs). We demonstrate how to implement QNNs in Python, using the Tensorflow and Keras libraries. We also demonstrate how to implement a Boolean encoding of QNNs as part of our tool, which is able to run a variety of solvers to verify QNNs.
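
    As a loose illustration of the kind of QNN implementation mentioned here (not the paper's actual code), the following Keras sketch quantizes activations uniformly and uses a straight-through estimator during training; the layer name, bit width, and clipping range are illustrative assumptions.

```python
import tensorflow as tf

# Hedged sketch: a uniform activation quantizer as a Keras layer. The forward
# pass rounds activations onto a fixed grid; the straight-through estimator
# makes the backward pass treat the quantizer as the identity.
class UniformQuantizer(tf.keras.layers.Layer):
    def __init__(self, num_bits=4, max_abs=1.0, **kwargs):
        super().__init__(**kwargs)
        self.levels = 2 ** num_bits - 1
        self.max_abs = max_abs

    def call(self, x):
        x = tf.clip_by_value(x, -self.max_abs, self.max_abs)
        scale = self.levels / (2.0 * self.max_abs)
        quantized = tf.round(x * scale) / scale
        # Forward uses quantized values, gradients flow as if unquantized.
        return x + tf.stop_gradient(quantized - x)

# A small QNN in this spirit: dense layers with quantized activations.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64), UniformQuantizer(num_bits=4),
    tf.keras.layers.Dense(10),
])
```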